The Minimum Number of Hidden Neurons Does Not Necessarily Provide the Best Generalization

Author

  • Jason M. Kinser

Abstract

The quality of a feedforward neural network that allows it to associate data not used in training is called generalization. A common method of creating the desired network is for the user to select the network architecture (largely by selecting the number of hidden neurons) and to allow a training algorithm to evolve the synaptic weights between the neurons. A popular belief is that the network with the fewest hidden neurons that correctly learns a sufficient training set is the network with the best generalization. This paper contradicts this belief. Optimizing generalization requires that the network not assume information that does not exist in the training data. Unfortunately, a network with the minimum number of hidden neurons may be forced to make exactly such assumptions. The network then skews the surface that maps the input space to the output space in order to accommodate the minimal architecture, which sacrifices generalization.
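The claim can be probed empirically by training the same one-hidden-layer architecture at several widths and comparing error on held-out data. The sketch below is illustrative only, not the paper's experiment: the toy sine-regression task, hyperparameters, and training loop are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (illustrative assumption): a smooth 1-D target,
# with a small training sample and a dense held-out test grid.
def target(x):
    return np.sin(2 * np.pi * x)

x_train = rng.uniform(0, 1, (40, 1))
y_train = target(x_train)
x_test = np.linspace(0, 1, 200).reshape(-1, 1)
y_test = target(x_test)

def train_mlp(n_hidden, epochs=2000, lr=0.1):
    """Train a 1 -> n_hidden -> 1 tanh network by batch gradient descent
    on squared error; return the mean squared error on the test set."""
    W1 = rng.normal(0, 1, (1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 1, (n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x_train @ W1 + b1)      # hidden activations
        pred = h @ W2 + b2                  # linear output layer
        err = pred - y_train
        # Backpropagate the squared-error gradient through both layers.
        gW2 = h.T @ err / len(x_train)
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h ** 2)    # tanh derivative
        gW1 = x_train.T @ dh / len(x_train)
        gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    h = np.tanh(x_test @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y_test) ** 2))

# Generalization (test MSE) as a function of hidden-layer width.
test_mse = {n: train_mlp(n) for n in (2, 4, 8, 16)}
for n, mse in test_mse.items():
    print(f"{n:2d} hidden neurons -> test MSE {mse:.4f}")
```

On runs like this, the smallest network that still fits the training data is not reliably the one with the lowest test error, which is the comparison the paper's argument is about.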


Similar articles

Optimization of Oleuropein Extraction from Olive Leaves using Artificial Neural Network

In this work, artificial neural network (ANN) technology was applied to the simulation of the oleuropein extraction process. A 3-layer network structure is applied, and operating factors such as flow intensity ratio, temperature, residence time, and pH are used as input variables of the network, whereas the extraction yield is considere...


Ensemble strategies to build neural network to facilitate decision making

There are three major strategies for forming neural network ensembles. The simplest one is the Cross Validation strategy, in which all members are trained with the same training data. Bagging and boosting strategies produce perturbed samples from the training data. This paper provides an ideal model based on two important factors: activation function and number of neurons in the hidden layer and based u...


Modeling heat transfer of non-Newtonian nanofluids using hybrid ANN-Metaheuristic optimization algorithm

An optimal artificial neural network (ANN) has been developed to predict the Nusselt number of non-Newtonian nanofluids. The resulting ANN is a multi-layer perceptron with two hidden layers consisting of six and nine neurons, respectively. The tangent sigmoid transfer function is best for both hidden layers, and the linear transfer function is best for the output layer...


An Effective EOS Based Modeling Procedure for Minimum Miscibility Pressure in Miscible Gas Injection

The measurement of the minimum miscibility pressure (MMP) is one of the most important steps in the project design of miscible gas injection for which several experimental and modeling methods have been proposed. On the other hand, the standard procedure for compositional studies of miscible gas injection process is the regression of EOS to the conventional PVT tests. Moreover, this procedure d...


The effect of prenatal restraint stress on the number and size of neurons in the rat hippocampal subdivisions

Animal studies have shown that prenatal stress can induce long-lasting neurobiological and behavioral alterations in adult offspring. Although the hippocampus is sensitive to early developmental influences and has a known functional importance in learning and memory, few data are available on the effect of prenatal stress on the structure of the hippocampus. Therefore, this study...



Journal title:

Volume   Issue

Pages  -

Publication date: 2001